Gaussian Process Latent Variable Models for Dimensionality Reduction and Time Series Modeling

Author

  • Nakul Gopalan
Abstract

High-dimensional time series data are frequently encountered in fields such as robotics, computer vision, economics, and motion capture. In this survey paper we first look at the Gaussian Process Latent Variable Model (GPLVM), a probabilistic nonlinear dimensionality reduction method. We then discuss the Gaussian Process Dynamical Model (GPDM), which is based on the GPLVM. The GPDM is a probabilistic approach that models high-dimensional time series data in a low-dimensional latent space equipped with a dynamical model. We also discuss variational approximations of the GPLVM and the Variational Gaussian Process Dynamical System (VGPDS), a dynamical model built on these variational approximations.
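As a minimal sketch of the two models named above (the notation is ours, not taken from the paper): the GPLVM maps latent points X ∈ R^{N×Q}, with Q ≪ D, to observations Y ∈ R^{N×D} through a GP mapping that is integrated out, and the GPDM additionally integrates out a GP over the latent dynamics, giving roughly

p(Y \mid X) = \prod_{d=1}^{D} \mathcal{N}\big(\mathbf{y}_{:d} \mid \mathbf{0},\, K_X + \sigma^2 I\big),
\qquad
p(x_2, \dots, x_N \mid x_1) \propto |K_{\mathrm{dyn}}|^{-Q/2} \exp\Big(-\tfrac{1}{2}\,\mathrm{tr}\big(K_{\mathrm{dyn}}^{-1} X_{2:N} X_{2:N}^{\top}\big)\Big),

where K_X is the kernel matrix evaluated at the latent points and K_dyn is the kernel matrix over X_{1:N-1}. Training chooses X and the kernel hyperparameters to maximize this joint density.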

Similar resources

Variational Gaussian Process Dynamical Systems

High dimensional time series are endemic in applications of machine learning such as robotics (sensor data), computational biology (gene expression data), vision (video sequences) and graphics (motion capture data). Practical nonlinear probabilistic approaches to this data are required. In this paper we introduce the variational Gaussian process dynamical system. Our work builds on recent varia...
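A rough sketch of the generative structure described here (our notation): each latent dimension is given a temporal GP prior, and the observations follow a GPLVM likelihood,

x_q(t) \sim \mathcal{GP}\big(0,\, k_x(t, t')\big), \quad q = 1, \dots, Q,
\qquad
p(Y \mid X) = \prod_{d=1}^{D} \mathcal{N}\big(\mathbf{y}_{:d} \mid \mathbf{0},\, K_f + \sigma^2 I\big),

so that time enters only through the prior on the latent trajectories; variational inference then integrates out X to obtain a tractable lower bound on p(Y | t).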

Shared Gaussian Process Latent Variable Models

A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in the parameterization of a data set. In this thesis we are interested in the crossroads between these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spa...
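A brief sketch of the shared model (our notation): several observation spaces are generated from one common latent space,

p\big(Y^{(1)}, \dots, Y^{(M)} \mid X\big) = \prod_{m=1}^{M} p\big(Y^{(m)} \mid X\big),

where each factor is a GPLVM likelihood with its own kernel, so the shared latent coordinates X capture structure common to all of the observation spaces.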

Bayesian Gaussian Process Latent Variable Model

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variation...
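The construction can be sketched as follows (our notation, following the description above rather than quoting the paper): a standard normal prior is placed on the latent inputs and a factorized Gaussian variational posterior is introduced, giving

q(X) = \prod_{n=1}^{N} \mathcal{N}(x_n \mid \mu_n, S_n),
\qquad
\log p(Y) \ge \mathbb{E}_{q(X)}\big[\log p(Y \mid X)\big] - \mathrm{KL}\big(q(X) \,\|\, p(X)\big),

with the otherwise intractable expectation handled through auxiliary inducing variables; maximizing the bound integrates the latent inputs out instead of optimizing point estimates of them.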

Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of res...
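As a rough sketch of the sparse variational bound such methods build on (our notation; the exact parameterization used in the paper may differ): inducing variables u at pseudo-inputs give a bound whose data term decomposes over individual observations,

\log p(Y) \ge \sum_{n=1}^{N} \mathbb{E}\big[\log p(y_n \mid f_n)\big] - \mathrm{KL}\big(q(u) \,\|\, p(u)\big),

and because the sum factorizes over data points, the per-point terms can be computed on separate machines and aggregated, which is what makes distributed inference feasible.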

Variational Inference for Uncertainty on the Inputs of Gaussian Process Models

The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dimensionality reduction that has been widely applied. However, the current approach for training GP-LVMs is based on maximum likelihood, where the latent projection variables are maximized over rather than integrated out. In this paper we present a Bayesian method for training GP-LVMs by introducing...
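To make the contrast between maximizing over and integrating out the latent inputs concrete, here is a minimal sketch in Python assuming the open-source GPy library; the class names and arguments below come from GPy, not from this page, and are an illustrative assumption.

import numpy as np
import GPy

# Toy high-dimensional observations: 100 points in 12 dimensions.
Y = np.random.randn(100, 12)

# Classical GP-LVM: the latent inputs X are point estimates, maximized over.
map_model = GPy.models.GPLVM(Y, input_dim=2)
map_model.optimize()

# Bayesian GP-LVM: a variational posterior over X is learned instead,
# approximately integrating the latent inputs out.
bayes_model = GPy.models.BayesianGPLVM(Y, input_dim=2, num_inducing=20)
bayes_model.optimize()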


Publication date: 2012